AI systems must be able to reason about complex objects as well as explicitly handle uncertainty. First-order logic provides the formalism to handle the former; probability theory provides the power to handle the latter. Combining the two has been a long-standing goal of AI research. In this talk, I will present Markov logic (Richardson & Domingos, 2006), which combines the power of full first-order logic and Markov networks. Markov logic represents the underlying world by attaching real-valued weights to formulas in first-order logic. The formulas in Markov logic can be seen as templates for constructing ground Markov networks. Carrying out propositional inference in such models leads to an explosion in time and memory. To overcome these problems, I will present the first lifted probabilistic inference algorithm with results on real data: lifted belief propagation. The parameters (formula weights) are learned using a voted perceptron algorithm. I will then present applications to the problems of entity resolution and the identification of social relationships in consumer photo collections. I will conclude the talk with directions for future work.
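To make the idea of weighted formulas concrete, here is a minimal toy sketch (my own illustration, not the implementation discussed in the talk): a Markov logic network assigns each possible world a probability proportional to exp(Σᵢ wᵢ nᵢ), where nᵢ counts the true groundings of formula i in that world. The domain, formula, and weight below are all hypothetical.

```python
import itertools
import math

# Toy domain and a single weighted formula: Smokes(x) => Cancer(x).
PEOPLE = ["Anna", "Bob"]
WEIGHT = 1.5  # hypothetical formula weight

def implies(a, b):
    # Material implication on ground atoms.
    return (not a) or b

def n_true_groundings(world):
    # The formula is grounded once per person in the domain.
    return sum(implies(world[("Smokes", p)], world[("Cancer", p)])
               for p in PEOPLE)

def unnormalized(world):
    # exp(weight * number of true groundings) for our one formula.
    return math.exp(WEIGHT * n_true_groundings(world))

# Enumerate all possible worlds over the ground atoms (brute force:
# feasible only for tiny domains, which is exactly the blow-up that
# lifted inference avoids).
atoms = [(pred, p) for pred in ("Smokes", "Cancer") for p in PEOPLE]
worlds = [dict(zip(atoms, vals))
          for vals in itertools.product([False, True], repeat=len(atoms))]

Z = sum(unnormalized(w) for w in worlds)  # partition function

def prob(world):
    return unnormalized(world) / Z
```

A world that satisfies more groundings of the formula receives a higher probability, and raising the weight sharpens that preference; a hard first-order constraint corresponds to an infinite weight.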